Linear Programming Relaxations and Belief Propagation - An Empirical Study

Authors

  • Chen Yanover
  • Talya Meltzer
  • Yair Weiss
Abstract

The problem of finding the most probable (MAP) configuration in graphical models comes up in a wide range of applications. In a general graphical model this problem is NP-hard, but various approximate algorithms have been developed. Linear programming (LP) relaxations are a standard method in computer science for approximating combinatorial problems and have been used for finding the most probable assignment in small graphical models. However, applying this powerful method to real-world problems is extremely challenging due to the large number of variables and constraints in the linear program. Tree-Reweighted Belief Propagation is a promising recent algorithm for solving LP relaxations, but little is known about its running time on large problems. In this paper we compare tree-reweighted belief propagation (TRBP) and powerful general-purpose LP solvers (CPLEX) on relaxations of real-world graphical models from the fields of computer vision and computational biology. We find that TRBP almost always finds the solution significantly faster than all the solvers in CPLEX and, more importantly, TRBP can be applied to large-scale problems for which the solvers in CPLEX cannot be applied. Using TRBP we can find the MAP configurations in a matter of minutes for a large range of real-world problems.

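To make the object being compared concrete, here is a minimal sketch (not the authors' code) of the standard pairwise LP relaxation over node and edge pseudomarginals, solved with SciPy's linprog as a stand-in for a general-purpose solver such as CPLEX. The toy 3-node cyclic model, the random potentials, and all variable names are illustrative assumptions.

```python
# Sketch: local-polytope LP relaxation of pairwise MAP, solved with a generic LP solver.
import numpy as np
from scipy.optimize import linprog

k = 2                                   # states per variable
nodes = [0, 1, 2]
edges = [(0, 1), (1, 2), (0, 2)]        # a small cycle (on a tree the relaxation is exact)

rng = np.random.default_rng(0)
theta_i = {i: rng.normal(size=k) for i in nodes}            # unary potentials
theta_ij = {e: rng.normal(size=(k, k)) for e in edges}      # pairwise potentials

# LP variables: one pseudomarginal mu_i(x_i) per node/state and
# one mu_ij(x_i, x_j) per edge/state pair.
idx, n_vars = {}, 0
for i in nodes:
    for xi in range(k):
        idx[('n', i, xi)] = n_vars; n_vars += 1
for (i, j) in edges:
    for xi in range(k):
        for xj in range(k):
            idx[('e', i, j, xi, xj)] = n_vars; n_vars += 1

# Objective: maximize the expected sum of potentials == minimize its negation.
c = np.zeros(n_vars)
for i in nodes:
    for xi in range(k):
        c[idx[('n', i, xi)]] = -theta_i[i][xi]
for (i, j) in edges:
    for xi in range(k):
        for xj in range(k):
            c[idx[('e', i, j, xi, xj)]] = -theta_ij[(i, j)][xi, xj]

# Equality constraints: normalization and edge-to-node marginalization.
A_eq, b_eq = [], []
for i in nodes:                                  # sum_xi mu_i(xi) = 1
    row = np.zeros(n_vars)
    for xi in range(k):
        row[idx[('n', i, xi)]] = 1.0
    A_eq.append(row); b_eq.append(1.0)
for (i, j) in edges:
    for xi in range(k):                          # sum_xj mu_ij(xi, xj) = mu_i(xi)
        row = np.zeros(n_vars)
        for xj in range(k):
            row[idx[('e', i, j, xi, xj)]] = 1.0
        row[idx[('n', i, xi)]] = -1.0
        A_eq.append(row); b_eq.append(0.0)
    for xj in range(k):                          # sum_xi mu_ij(xi, xj) = mu_j(xj)
        row = np.zeros(n_vars)
        for xi in range(k):
            row[idx[('e', i, j, xi, xj)]] = 1.0
        row[idx[('n', j, xj)]] = -1.0
        A_eq.append(row); b_eq.append(0.0)

res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq), bounds=(0, 1))
print("LP optimum (upper bound on the MAP value):", -res.fun)
# If every mu_i turns out integral, taking argmax_x mu_i(x) recovers the exact MAP.
```

The constraint matrix already has one row per node plus 2k rows per edge; on the vision and biology models studied in the paper this blows up to millions of variables and constraints, which is exactly the regime where message-passing solvers like TRBP become attractive.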
Similar Articles

Beyond Loose LP-Relaxations: Optimizing MRFs by Repairing Cycles

This paper presents a new MRF optimization algorithm, which is derived from Linear Programming and manages to go beyond current state-of-the-art techniques (such as those based on graph-cuts or belief propagation). It does so by relying on a much tighter class of LP-relaxations, called cycle-relaxations. With the help of this class of relaxations, our algorithm tries to deal with a difficulty l...

Tightening LP Relaxations for MAP using Message Passing

Linear Programming (LP) relaxations have become powerful tools for finding the most probable (MAP) configuration in graphical models. These relaxations can be solved efficiently using message-passing algorithms such as belief propagation and, when the relaxation is tight, provably find the MAP configuration. The standard LP relaxation is not tight enough in many real-world problems, however, an...

On Convex Programming Relaxations for the Permanent

In recent years, several convex programming relaxations have been proposed to estimate the permanent of a non-negative matrix, notably in the works of [GS02, Gur11, GS14]. However, the origins of these relaxations and their relationships to each other have remained somewhat mysterious. We present a conceptual framework, implicit in the belief propagation literature, to systematically arrive at ...

Inference in Graphical Models via Semidefinite Programming Hierarchies

Maximum A posteriori Probability (MAP) inference in graphical models amounts to solving a graph-structured combinatorial optimization problem. Popular inference algorithms such as belief propagation (BP) and generalized belief propagation (GBP) are intimately related to linear programming (LP) relaxation within the Sherali-Adams hierarchy. Despite the popularity of these algorithms, it is well ...

Tightness Results for Local Consistency Relaxations in Continuous MRFs

Finding the MAP assignment in graphical models is a challenging task that generally requires approximations. One popular approximation approach is to use linear programming relaxations that enforce local consistency. While these are commonly used for discrete variable models, they are much less understood for models with continuous variables. Here we define local consistency relaxations of MAP ...

Journal:
  • Journal of Machine Learning Research

Volume 7

Pages -

Publication year: 2006